

Search for: All records

Creators/Authors contains: "Najafi, M"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).


  1. Free, publicly-accessible full text available May 4, 2026
  2. Low-cost, hardware-efficient design of trigonometric functions is challenging. Stochastic computing (SC), an emerging computing model that processes random bit-streams, offers promising solutions to this problem. Existing implementations, however, often overlook the data converters needed to generate the required bit-streams. While recent advances in SC bit-stream generators focus on basic arithmetic operations such as multiplication and addition, energy-efficient SC design of non-linear functions demands attention to both the computation circuit and the bit-stream generator. This work introduces TriSC, a novel approach to the SC-based design of trigonometric functions that leverages state-of-the-art (SOTA) quasi-random bit-streams. Unlike SOTA SC designs of trigonometric functions, which rely heavily on delay elements to decorrelate bit-streams, our approach avoids delay elements while improving the accuracy of the results. TriSC yields significant energy savings of up to 92% compared to the SOTA. As two novel use cases studied for the first time in the SC literature, we employ the proposed design for 2D image transformation and for the forward kinematics of a robotic arm, two computation-intensive applications that demand low-cost trigonometric designs. (An illustrative sketch of quasi-random SC multiplication appears after this list.)
    Free, publicly-accessible full text available November 7, 2025
  3. Hyperdimensional computing (HDC) is an emerging computing paradigm with significant promise for efficient and robust learning. In HDC, objects are encoded with high-dimensional vector symbolic sequences called hypervectors. The quality of hypervectors, defined by their distribution and independence, directly impacts the performance of HDC systems. Despite a large body of work on the processing parts of HDC systems, little to no attention has been paid to data encoding and the quality of hypervectors. Most prior studies have generated hypervectors using built-in random functions, such as MATLAB’s or Python’s random function. This work introduces an optimization technique for generating hypervectors by employing quasi-random sequences. These sequences have recently demonstrated their effectiveness in achieving accurate and low-discrepancy data encoding in stochastic computing systems. The study outlines the optimization steps for utilizing Sobol sequences to produce high-quality hypervectors in HDC systems. An optimization algorithm is proposed to select, by index, the Sobol sequences that generate minimally correlated hypervectors, particularly in applications related to symbol-oriented architectures. The performance of the proposed technique is evaluated against two traditional approaches to generating hypervectors, based on linear-feedback shift registers and MATLAB random functions. The evaluation is conducted for three applications: (i) language, (ii) headline, and (iii) medical image classification. Our experimental results demonstrate accuracy improvements of up to 10.79%, depending on the vector size. Additionally, the proposed encoding hardware exhibits reduced energy consumption and a superior area-delay product. (An illustrative Sobol-based hypervector sketch appears after this list.)
  4. Precise seizure identification plays a vital role in understanding cortical connectivity and informing treatment decisions. Yet, the manual diagnostic methods for epileptic seizures are both labor-intensive and highly specialized. In this study, we propose a Hyperdimensional Computing (HDC) classifier for accurate and efficient multi-type seizure classification. While previous HDC-based seizure analysis efforts have been limited to binary detection (seizure or no seizure), our work breaks new ground by utilizing HDC to classify seizures into multiple distinct types. HDC offers significant advantages, such as lower memory requirements, a reduced hardware footprint for wearable devices, and decreased computational complexity. Due to these attributes, HDC can be an alternative to traditional machine learning methods, making it a practical and efficient solution, particularly in resource-limited scenarios or applications involving wearable devices. We evaluated the proposed technique on the latest version of the TUH EEG Seizure Corpus (TUSZ), and the evaluation results demonstrate noteworthy performance, achieving a weighted F1 score of 94.6%. This outcome matches, or even exceeds, the performance achieved by state-of-the-art traditional machine learning methods. (A generic HDC classification sketch appears after this list.)
  5. Antenna designs play a crucial role in wireless communication systems, where demanding performance specifications must be met. The radiation pattern (RP) specification in both the E-plane and H-plane is important, as it determines the antenna gain along a given direction. Computing this performance across the entire bandwidth for various frequencies is time-consuming. To speed up these simulations, a new approach based on a generative adversarial network (GAN) is presented that predicts the expected radiation pattern for the frequencies of interest. This method is verified on two previously optimized antennas, one operating in the 8.8-10.1 GHz band and the other in the 11.3-13.16 GHz band. The experimental simulation results show that the mean absolute error is less than 0.35, which yields suitable accuracy for RP predictions. (An illustrative conditional-generator sketch appears after this list.)
  6. Word-aware sentiment analysis has posed a significant challenge over the past decade. Despite the considerable efforts of recent language models, achieving a lightweight representation suitable for deployment on resource-constrained edge devices remains a crucial concern. This study proposes a novel solution by merging two emerging paradigms, the Word2Vec language model and Hyperdimensional Computing, and introduces an innovative framework named Word2HyperVec. Our framework prioritizes model size and facilitates low-power processing during inference by mapping embeddings into a binary space. Our solution demonstrates significant advantages, consuming only 2.2 W, which is up to 1.81× more efficient than alternative learning models such as support vector machines, random forests, and multi-layer perceptrons. (An illustrative embedding-binarization sketch appears after this list.)
  8. Hyperdimensional computing (HDC) is a novel computational paradigm that operates on high-dimensional vectors known as hypervectors. The hypervectors are constructed as long bit-streams and form the basic building blocks of HDC systems. In HDC, hypervectors are generated from scalar values without considering bit significance. HDC is efficient and robust for various data processing applications, especially computer vision tasks. To construct HDC models for vision applications, the current state-of-the-art practice utilizes two parameters for data encoding: pixel intensity and pixel position. However, the intensity and position information embedded in the high-dimensional vectors is generally not generated dynamically in HDC models. Consequently, the optimal design of hypervectors with high model accuracy requires powerful computing platforms for training. A more efficient approach is to generate hypervectors dynamically during the training phase. To this end, this work uses low-discrepancy sequences to generate intensity hypervectors while avoiding position hypervectors altogether. Doing so eliminates the multiplication step in vector encoding, resulting in a power-efficient HDC system. For the first time in the literature, our proposed approach employs lightweight vector generators utilizing unary bit-streams for efficient data encoding instead of conventional comparator-based generators. (An illustrative unary intensity-encoding sketch appears after this list.)
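
Illustrative sketch for item 2 (TriSC): the Python snippet below is not the TriSC trigonometric circuit from the paper; it only demonstrates how comparator-style stream generation driven by a low-discrepancy sequence (a bit-reversed counter, i.e., the first Sobol / van der Corput dimension) makes unipolar SC multiplication with a single AND gate typically more accurate than pseudo-random streams of the same length. The bit width and input values are arbitrary assumptions.

    import random

    def bit_reverse(i, n_bits):
        """Reverse the n_bits-bit binary representation of i."""
        r = 0
        for _ in range(n_bits):
            r = (r << 1) | (i & 1)
            i >>= 1
        return r

    def sc_multiply_quasi(x, y, n_bits=8):
        """Unipolar SC multiply: two comparator-generated streams ANDed bit by bit.
        One operand is driven by a plain counter, the other by the bit-reversed
        counter (the first dimension of a Sobol / van der Corput sequence)."""
        n = 1 << n_bits
        ones = 0
        for i in range(n):
            bit_x = 1 if i / n < x else 0                       # counter-driven stream
            bit_y = 1 if bit_reverse(i, n_bits) / n < y else 0  # low-discrepancy stream
            ones += bit_x & bit_y                               # AND gate = multiplication
        return ones / n

    def sc_multiply_pseudo(x, y, n=256):
        """Same operation with pseudo-random streams, for comparison."""
        return sum((random.random() < x) & (random.random() < y) for _ in range(n)) / n

    x, y = 0.30, 0.75
    print("quasi-random :", sc_multiply_quasi(x, y))   # close to the exact 0.225
    print("pseudo-random:", sc_multiply_pseudo(x, y))  # noisier estimate
    print("exact        :", x * y)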
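
Illustrative sketch for item 3: a minimal way to derive binary hypervectors from Sobol points (here via scipy.stats.qmc, assumed available) by thresholding each Sobol dimension at 0.5, and to confirm that the pairwise normalized Hamming distances stay close to 0.5 (near-orthogonality), as desired for HDC symbol hypervectors. The paper's Sobol index-selection algorithm and the LFSR baseline are not reproduced here.

    import numpy as np
    from scipy.stats import qmc

    D = 4096          # hypervector dimensionality (a power of two keeps Sobol balanced)
    K = 8             # number of symbols to encode

    sobol = qmc.Sobol(d=K, scramble=False)
    points = sobol.random(D)               # shape (D, K): one Sobol dimension per symbol
    hvs = (points > 0.5).astype(np.uint8)  # threshold each dimension into a binary HV

    def hamming(a, b):
        return np.count_nonzero(a != b) / a.size

    # Near-orthogonal hypervectors should sit near 0.5 normalized Hamming distance.
    sobol_d = [hamming(hvs[:, i], hvs[:, j]) for i in range(K) for j in range(i + 1, K)]
    print("Sobol-based HVs, min/max pairwise distance:", min(sobol_d), max(sobol_d))

    rng = np.random.default_rng(0)
    rnd = rng.integers(0, 2, size=(D, K), dtype=np.uint8)
    rand_d = [hamming(rnd[:, i], rnd[:, j]) for i in range(K) for j in range(i + 1, K)]
    print("random HVs,      min/max pairwise distance:", min(rand_d), max(rand_d))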
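
Illustrative sketch for item 4: a generic HDC nearest-prototype classifier on synthetic data, showing record-based encoding (bind a per-channel hypervector with a quantized-level hypervector, then bundle), per-class prototype bundling, and similarity search. The EEG feature extraction, the TUSZ data, and every size below are stand-ins, not the paper's pipeline.

    import numpy as np

    rng = np.random.default_rng(1)
    D, n_features, n_levels, n_classes = 10_000, 32, 8, 4

    channel_hvs = rng.choice([-1, 1], size=(n_features, D))  # one bipolar HV per channel
    level_hvs = rng.choice([-1, 1], size=(n_levels, D))      # one bipolar HV per level

    def encode(sample):
        """Record-based encoding: bind channel and level HVs, then bundle (sum)."""
        levels = np.clip((sample * n_levels).astype(int), 0, n_levels - 1)
        return (channel_hvs * level_hvs[levels]).sum(axis=0)

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    # Synthetic "features": each class clusters around a different mean value.
    X = rng.random((200, n_features)) * 0.3 + np.repeat(np.linspace(0, 0.7, n_classes), 50)[:, None]
    y = np.repeat(np.arange(n_classes), 50)

    # Class prototypes: bundle the encodings of all training samples of a class.
    prototypes = [np.sum([encode(x) for x in X[y == c]], axis=0) for c in range(n_classes)]

    query = X[120]                                  # drawn from class 2
    scores = [cosine(encode(query), p) for p in prototypes]
    print("predicted:", int(np.argmax(scores)), " true:", int(y[120]))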
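
Illustrative sketch for item 5: a tiny conditional generator (PyTorch assumed available) that maps a noise vector plus a normalized operating frequency to a sampled radiation-pattern vector. The layer sizes, the 181-point plane cut, and the normalization are assumptions for illustration; the paper's GAN architecture, discriminator, and training procedure are not reproduced.

    import torch
    import torch.nn as nn

    N_ANGLES = 181          # assumed: one gain sample per degree over a plane cut

    class RPGenerator(nn.Module):
        def __init__(self, noise_dim=32):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(noise_dim + 1, 128), nn.ReLU(),
                nn.Linear(128, 256), nn.ReLU(),
                nn.Linear(256, N_ANGLES),      # normalized gain value per angle
            )

        def forward(self, noise, freq):
            cond = torch.cat([noise, freq.unsqueeze(-1)], dim=-1)  # condition on frequency
            return self.net(cond)

    gen = RPGenerator()
    noise = torch.randn(4, 32)
    freqs = torch.tensor([8.8, 9.2, 9.6, 10.1]) / 10.1   # normalize to (0, 1]
    patterns = gen(noise, freqs)
    print(patterns.shape)                                # torch.Size([4, 181])

    # The paper reports a mean absolute error below 0.35 against the simulated RP:
    # mae = (patterns - simulated_patterns).abs().mean()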
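
Illustrative sketch for item 6 (Word2HyperVec): the general idea of carrying dense word embeddings into a binary hyperdimensional space by sign thresholding, bundling word hypervectors with a majority vote, and comparing with Hamming distance. The toy vocabulary and the random stand-in embeddings are invented for the example; the framework's actual encoding and training are not shown.

    import numpy as np

    rng = np.random.default_rng(7)
    EMBED_DIM = 300                      # typical Word2Vec embedding size

    # Stand-in for pretrained Word2Vec vectors (normally loaded from a model file).
    vocab = {w: rng.standard_normal(EMBED_DIM) for w in
             ["great", "awful", "movie", "plot", "loved", "boring"]}

    def to_binary_hv(vec):
        """Map a real-valued embedding to {0,1}^D by sign thresholding."""
        return (vec > 0).astype(np.uint8)

    def bundle(hvs):
        """Majority vote across hypervectors (ties broken toward 1)."""
        return (np.sum(hvs, axis=0) * 2 >= len(hvs)).astype(np.uint8)

    def hamming(a, b):
        return np.count_nonzero(a != b) / a.size

    sentence_hv = bundle([to_binary_hv(vocab[w]) for w in ["loved", "great", "plot"]])
    pos_proto = bundle([to_binary_hv(vocab[w]) for w in ["great", "loved"]])
    neg_proto = bundle([to_binary_hv(vocab[w]) for w in ["awful", "boring"]])

    label = ("positive" if hamming(sentence_hv, pos_proto) < hamming(sentence_hv, neg_proto)
             else "negative")
    print(label)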
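
Illustrative sketch for item 8: unary (thermometer-coded) intensity hypervectors, where the first k positions of a fixed low-discrepancy ordering are set to 1. No comparator and no per-element multiplication are needed, and nearby intensities map to nearby hypervectors. The bit-reversal ordering, the hypervector length, and the 8-bit pixel range are assumptions, not the paper's exact generator.

    import numpy as np

    D = 1024                              # hypervector length (a power of two)

    def bit_reverse_permutation(n):
        """Permutation of 0..n-1 in bit-reversed (van der Corput) order."""
        bits = int(n).bit_length() - 1
        return np.array([int(format(i, f"0{bits}b")[::-1], 2) for i in range(n)])

    perm = bit_reverse_permutation(D)     # low-discrepancy ordering of positions

    def intensity_hv(pixel, d=D):
        """Unary encoding: set the first round(pixel/255 * d) positions to 1,
        visited in the low-discrepancy order."""
        k = round(pixel / 255 * d)
        hv = np.zeros(d, dtype=np.uint8)
        hv[perm[:k]] = 1
        return hv

    def hamming(a, b):
        return np.count_nonzero(a != b) / a.size

    # Similar intensities give similar hypervectors; distant ones diverge.
    print(hamming(intensity_hv(100), intensity_hv(110)))   # roughly 10/255
    print(hamming(intensity_hv(100), intensity_hv(220)))   # roughly 120/255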